
    Concentration of the Kirchhoff index for Erdos-Renyi graphs

    Given an undirected graph, the resistance distance between two nodes is the resistance one would measure between them in an electrical network in which every edge is a resistor. Summing these distances over all pairs of nodes yields the so-called Kirchhoff index of the graph, which measures its overall connectivity. In this work, we consider Erdos-Renyi random graphs. Since the graphs are random, their Kirchhoff indices are random variables. We give formulas for the expected value of the Kirchhoff index and show that it concentrates around its expectation. We achieve this by studying the trace of the pseudoinverse of the Laplacian of Erdos-Renyi graphs. For synchronization (a class of estimation problems on graphs), our results imply that acquiring pairwise measurements uniformly at random is a good strategy, even if only a vanishing proportion of the measurements can be acquired.
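
    A minimal sketch of the central quantity, not taken from the paper: the Kirchhoff index computed through the trace of the Laplacian pseudoinverse via the standard identity Kf(G) = n * tr(L^+), evaluated on Erdos-Renyi graphs drawn with networkx. The graph size and edge probability below are illustrative assumptions.

```python
import numpy as np
import networkx as nx

def kirchhoff_index(G):
    """Kirchhoff index: sum of resistance distances over all node pairs."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    n = G.number_of_nodes()
    # For a connected graph, Kf(G) = n * trace(L^+), where L^+ is the
    # Moore-Penrose pseudoinverse of the graph Laplacian.
    return n * np.trace(np.linalg.pinv(L))

# Illustrative parameters (my choice): at n = 200, p = 0.1 the graph is
# connected with high probability, so the index is finite; repeating the
# draw a few times hints at the concentration the abstract describes.
n, p = 200, 0.1
samples = [kirchhoff_index(nx.erdos_renyi_graph(n, p, seed=s)) for s in range(5)]
print(np.mean(samples), np.std(samples))
```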

    The Spectrum of Random Inner-product Kernel Matrices

    We consider n-by-n matrices whose (i, j)-th entry is f(X_i^T X_j), where X_1, ..., X_n are i.i.d. standard Gaussian random vectors in R^p and f is a real-valued function. The eigenvalue distribution of these random kernel matrices is studied in the "large p, large n" regime. It is shown that when p and n go to infinity with p/n = \gamma held constant, and f is scaled so that Var(f(X_i^T X_j)) is O(p^{-1}), the spectral density converges weakly to a limiting density on R. The limiting density is characterized by a cubic equation involving its Stieltjes transform. While for smooth kernel functions the limiting spectral density has previously been shown to be the Marcenko-Pastur distribution, our analysis also applies to non-smooth kernel functions, yielding a new family of limiting densities.
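
    The construction of these kernel matrices is straightforward to simulate. The sketch below is not the authors' code: it builds f(X_i^T X_j) for i.i.d. Gaussian X_i with a non-smooth kernel (the sign function) and an assumed 1/sqrt(p), 1/sqrt(n) scaling chosen so off-diagonal entries have variance of order p^{-1}, then computes the empirical spectrum.

```python
import numpy as np

def kernel_spectrum(n=1000, gamma=2.0, f=np.sign, seed=0):
    """Empirical eigenvalues of an n-by-n inner-product kernel matrix."""
    rng = np.random.default_rng(seed)
    p = int(gamma * n)                        # p/n = gamma held fixed
    X = rng.standard_normal((n, p))           # i.i.d. standard Gaussian rows in R^p
    K = f(X @ X.T / np.sqrt(p)) / np.sqrt(n)  # elementwise kernel, rescaled entries
    np.fill_diagonal(K, 0.0)                  # keep only the off-diagonal part
    return np.linalg.eigvalsh(K)              # symmetric matrix, so eigvalsh applies

eigs = kernel_spectrum()                      # f = sign is a non-smooth kernel
print(eigs.min(), eigs.max())
```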

    Spectral Embedding Norm: Looking Deep into the Spectrum of the Graph Laplacian

    Extracting clusters from a dataset that contains multiple clusters together with a significant background component is a non-trivial task of practical importance; in image analysis it arises, for example, in anomaly detection and target detection. The traditional spectral clustering algorithm, which relies on the leading K eigenvectors to detect K clusters, fails in such cases. In this paper we propose the spectral embedding norm, which sums the squared values of the first I normalized eigenvectors, where I can be significantly larger than K. We prove that this quantity can be used to separate clusters from the background in unbalanced settings, including extreme cases such as outlier detection. The performance of the algorithm is not sensitive to the choice of I, and we demonstrate its application on synthetic and real-world remote sensing and neuroimaging datasets.
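
    A minimal sketch of the quantity as described in the abstract, not the authors' implementation: for each node, sum the squared entries of the first I eigenvectors of the normalized graph Laplacian. The toy graph, the cluster sizes, and the choice of I below are illustrative assumptions.

```python
import numpy as np
import networkx as nx

def spectral_embedding_norm(G, I):
    """Per-node sum of squared entries of the first I Laplacian eigenvectors."""
    L = nx.normalized_laplacian_matrix(G).toarray()
    _, vecs = np.linalg.eigh(L)       # orthonormal eigenvectors, ascending eigenvalues
    return np.sum(vecs[:, :I] ** 2, axis=1)

# Toy graph (my construction): two dense clusters plus a large sparse background,
# drawn from a stochastic block model.
sizes = [30, 30, 140]
probs = [[0.50, 0.02, 0.02],
         [0.02, 0.50, 0.02],
         [0.02, 0.02, 0.02]]
G = nx.stochastic_block_model(sizes, probs, seed=0)
norms = spectral_embedding_norm(G, I=20)
print(norms[:10])                     # cluster nodes tend to have larger norm
```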